Focus Report: System Design Tools

System-level design tools are opening up a wealth of possibilities (and questions) for designers. By Steven E. Schulz


There's an old adage among engineers that goes something like this: "80 percent of your design is determined in the first 20 percent of the design flow." This saying has been underscored by a recent explosion in new EDA tools addressing the front end of the design flow (electronic system level, or ESL). These new tools are opening up a wealth of possibilities to aid designers, but also a wealth of questions. Do these new tools really work? What input language(s) should I use? How does this fit with my existing flow? What is the right methodology for my application?

A new breed of large designs in silicon is pushing our aging RTL-to-gates methodology to the breaking point. This methodology is completely inadequate when tackling complex interactions with embedded software, analog, or other mixed implementation technologies (see sidebar, "A Day in the Life of an Engineer"). The same limitations tax electronic designers of aggregate systems, too.

We will explore this new frontier in EDA by offering a brief tutorial on system-level concepts, comparing the current crop of system-level product offerings, gazing into the future, and addressing a few of those tough questions.

Being obtuse

System-level design is an inherently vague label for a vague notion. Consequently, system-level methodologies often appear to be mysterious and vague, too. This sadly makes it easier for designers to put it off for one more day. The ESL tools we cover here are largely focused on a microelectronics-centric definition of a system, and in most cases are confined to digital hardware or software.

In fact, understanding vague systems and vague specifications is precisely the problem these ESL tools hope to solve. Without a clear specification at a higher level of abstraction, it is impossible to select the best architecture, synthesize valid implementations from the requirements, or properly estimate whether the design goals can be met at the end of the design cycle. It simply isn't possible to validate that which can't be specified. There is nothing vague about an ESL methodology: it focuses solidly on understanding behavior and relationships among the primary components of your design.

Abstraction may also seem fuzzy at first; however, it is the most concise means by which to state overall design intent. One risk of using low-level semantics in lieu of abstraction is over-constraining the design space and over-complicating verification. The goals of starting at higher abstraction levels are to explore more of the design space, prune away poor choices faster, and focus detailed work exclusively on feasible options.

With the rapid move to finer process geometries accelerating the integration of processor cores, engineers found themselves caught short in handling embedded software as part of the system-on-a-chip (SOC) ASIC. Co-verification products have been a necessity, yet many quickly found a need for more. Today's VHDL and Verilog HDLs were not designed for co-design flows. Many of these tools attempt to help close the gap between hardware and embedded software co-design.

Do as I want, not as I say

Although much of the industry debate has centered on the choice of input languages, these syntactic wars are a red herring. Far more important to design productivity and faithful prediction of system operation are the semantics, or meaning, represented underneath the syntax of the models. There are two primary categories of modeling concern: models of computation (MOC) and models of communication (Mocomm). Models of computation define how inputs to a model are interpreted for processing into outputs; familiar examples include discrete event, cycle-based, synchronous data flow, and asynchronous control flow. Models of communication define how concurrent behaviors interact with one another, and can be categorized within the conceptual framework of blocking or non-blocking reads and writes; examples include interrupt register maps, polling register maps, shared memory, and semaphore-protected access. It's important to understand the semantics supported by your particular tool suite, as you may not be able to accurately model or verify all interactions of concern.
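The blocking category of communication models can be made concrete with a small sketch. The following C++ fragment (illustrative only, not taken from any of the tools discussed here) models a bounded channel where a writer stalls when the buffer is full and a reader stalls when it is empty; this back-pressure behavior is exactly what distinguishes blocking reads and writes from their non-blocking counterparts:

```cpp
#include <condition_variable>
#include <cstddef>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// A bounded channel with blocking reads and writes: the writer stalls when
// the buffer is full, and the reader stalls when it is empty.
template <typename T>
class BlockingChannel {
public:
    explicit BlockingChannel(std::size_t capacity) : capacity_(capacity) {}

    void write(const T& value) {               // blocking write
        std::unique_lock<std::mutex> lock(m_);
        not_full_.wait(lock, [&] { return q_.size() < capacity_; });
        q_.push(value);
        not_empty_.notify_one();
    }

    T read() {                                 // blocking read
        std::unique_lock<std::mutex> lock(m_);
        not_empty_.wait(lock, [&] { return !q_.empty(); });
        T value = q_.front();
        q_.pop();
        not_full_.notify_one();
        return value;
    }

private:
    std::size_t capacity_;
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable not_empty_, not_full_;
};

// Run a producer and a consumer concurrently over a small channel and
// return the sequence the consumer observed.
std::vector<int> run_producer_consumer(int n) {
    BlockingChannel<int> ch(2);                // capacity 2 forces back-pressure
    std::vector<int> seen;
    std::thread producer([&] {
        for (int i = 0; i < n; ++i) ch.write(i);
    });
    std::thread consumer([&] {
        for (int i = 0; i < n; ++i) seen.push_back(ch.read());
    });
    producer.join();
    consumer.join();
    return seen;
}
```

Because both ends block, the producer can never overrun the consumer; the same two behaviors wired through a non-blocking model would instead have to handle overflow and underflow explicitly.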

As with any emerging market, the industry has yet to converge on which input languages will (and will not) succeed. System-level tools face the additional challenges of diverging application and methodology needs, as well as a blurring of previously distinct usage scenarios. For example, the needs of telecommunication systems for efficient system description can be quite different from those of someone designing a PDA or a new graphics chip. While hardware designers prefer to specify concurrency and time using their familiar HDL, software developers are committed to C, C++, or Java. What does one do when the application requires tradeoffs among multiple semantic domains?

Pure software languages lack inherent semantics for accurate hardware modeling, and neither Verilog nor VHDL offer sufficient semantic abstraction, let alone familiarity for software developers. Furthermore, no single language is capable of containing all desirable semantic features that frequently occur in complex heterogeneous systems.

Products based on either C or C++ utilize added semantic extensions to support the notions of concurrency and time. In C++, this is done using a class library approach: the specification of concurrent behavior must be made explicit by using the appropriate classes. The scheduling of concurrent behavior and signal assignments may be done either at compile time or during runtime. If performed at compile time, the scheduled order of execution will be invariant to input stimulus. Beware, as simulations can't highlight concurrency issues not specifically modeled. Furthermore, standard C/C++ software compilers cannot check for semantic errors they themselves do not recognize or understand (see SpecC and Superlog in the tools section below).
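The class-library idea can be sketched in a few lines of plain C++. The `Process` and `StaticScheduler` names below are hypothetical, not from SystemC or any product; the point is simply that concurrency becomes explicit by deriving from a library class, and that a statically fixed schedule executes the behaviors in the same order every cycle, regardless of stimulus:

```cpp
#include <vector>

// Concurrency made explicit through a class library: each behavior derives
// from Process and registers with a scheduler.
struct Process {
    virtual ~Process() = default;
    virtual void tick() = 0;                       // one cycle of behavior
};

class StaticScheduler {
public:
    void add(Process* p) { order_.push_back(p); }  // schedule fixed at registration
    void run_cycles(int n) {
        for (int c = 0; c < n; ++c)
            for (Process* p : order_) p->tick();   // same order every cycle
    }
private:
    std::vector<Process*> order_;
};

// Two toy "hardware" behaviors communicating through a shared value.
struct Counter : Process {
    int value = 0;
    void tick() override { ++value; }
};

struct Sampler : Process {
    const Counter& src;
    std::vector<int> trace;
    explicit Sampler(const Counter& c) : src(c) {}
    void tick() override { trace.push_back(src.value); }
};
```

Registering the `Counter` before the `Sampler` means the sampler always sees the freshly incremented value (1, 2, 3, ...); swapping the registration order would shift every sample by one cycle. This order-sensitivity is precisely the kind of concurrency issue the text warns a plain C++ compiler cannot flag.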

Formalized languages, such as the Specification and Description Language (SDL), have intrinsic benefits that include automatic test-case generation and the application of formal static verification. Telecom and aerospace designers have long embraced SDL, which is known for its communication-channel abstraction and its event-driven semantics spanning both synchronous and asynchronous control flow. SDL can also be coupled with the Unified Modeling Language (UML), which can help as a very informal notation for top-level requirements. Many additional language examples exist, such as Esterel, Lustre, Haskell, and multiple variations of the original Statecharts.

System Level Design Language (SLDL) is a worldwide project supported through Accellera (the recent merger of VHDL International and Open Verilog International), with the goal of providing an interoperable environment for the various semantics and languages required in system-level design. SLDL includes Rosetta, a new high-level constraint language, defined as an integral part of SLDL, that will permit simulation and formal reasoning across multi-domain semantics. Rosetta uniquely offers broad support for heterogeneous design constraints across all popular computational and communication semantic domains. However, both SLDL and Rosetta are still in the prototype stage and are not expected to be available in commercial tools until at least mid-2001.

So much for theory, show me results

Within the last 12 months, a virtual explosion of system-level tools has been released into the EDA marketplace. Users have gone from practically no options to a confusing array of options, with lots of scary new ideas that appear to put familiar, proven methodologies at risk. As we will see, there is relatively little need for concern, other than the need to learn a few new tricks to take advantage of the productivity gains that await. Most of the new tools complement existing RTL synthesis and verification flows, and simply leverage a more powerful modeling style to help designers avoid wasting time downstream. The Focus Table delineates relevant features among the current crop of tools, some of which are discussed in the following paragraphs.

In the works for several years at Cadence Berkeley Labs, the Cierto Virtual Component Co-design tool suite (VCC) is the result of Cadence's "Felix" initiative. VCC's performance simulations inform the designer, at a very early design stage, whether the system timing of a given architecture is capable of satisfying target system performance requirements. VCC features a complete communication refinement technology that starts with untimed communications in application-specific terms, such as asynchronous transfer mode (ATM) cells, and can resolve mismatches that might exist between protocols and data types during code generation. Next, the communication synthesis engine automatically inserts the necessary logic to implement the chosen protocols between the behavioral blocks, regardless of whether those blocks are mapped to hardware or software.

VCC, like several other tools, exports software-mapped behavior into processor-specific code for the selected RTOS. In addition, the RTOS is configured with the appropriate interrupt handlers, counter and timer setup, and static schedulers. Cadence claims that only VCC permits users to continue to work at a consistent level of design abstraction, even after detailed implementation refinements have been performed. Cierto VCC is also distinguished by its ability to generate complete test benches for verifying downstream hardware/software implementations using popular co-verification tools.

Archimate from Arexsys allows high-level system modeling using C, SDL, and VHDL connected through a block diagram editor. From this description, engineers can perform functional simulation (using complementary tools), and then can perform structural partitioning and communication synthesis, binding of functional units to hardware and software, hardware synthesis, and software generation. Archimate's communication synthesis permits packet, bus, and user-defined protocols to be implemented using popular models of communication, such as shared memory, FIFO, or buffer schemes. Software targeting performs static scheduling to fix parallel task execution order, then maps software processes on to one or more microprocessors. Arexsys has technology sharing partnerships with Telelogic, Mentor Graphics, and Co-design Automation.

Cynapps offers a suite of tools centered around the use of C++ and its own open-source class library for hardware semantics, called Cynlib. The Cynapps suite features a tool called Cynthesizer that translates fully elaborated C++/Cynlib code into RTL Verilog, providing a direct path to the hardware synthesis flow. Cynthesizer largely preserves variable names, control-flow structure, and even comments. The upcoming version of Cynthesizer will add more behavioral-level synthesis features, such as scheduling and resource allocation. The Cynapps suite also includes import of RTL HDL code into the Cynlib environment for fast simulation, as well as a Verilog PLI interface.

Vast Systems Technology made its debut at DAC this year with Comet, a new hardware-software tool set for co-design. Vast claims complete system timing accuracy and that Comet's virtual processor models (VPMs) can run operating systems (with full applications) at 150 million instructions per second (MIPS). Comet includes many "what if" options, with architectural experimentation among hardware-software partitions, microprocessor models and configurations, operating system options, and performance/cost trade-offs. Comet's graphical interface allows for interactive single stepping from hardware to software and vice versa.

With arguably the largest installed base to date, Coware's N2C (napkin-to-chip) C-based design environment has been one of the drivers for platform-based design. N2C attempts to cleanly separate behavior from communication and function from architecture. Coware recently added multiple-processor support for interface synthesis, added greater automation for virtual platform model creation, and further automated creation of system software views of the platform. N2C offers a variety of visualization tools to aid analysis, such as Gantt charts for temporal relationships between components, bar charts for processor loading, bus and memory performance, cache hits/misses, and statistical switching activity for power estimation.

Among the system-level start-ups, Co-Design Automation is one of those that achieved a high "buzz factor" at this year's DAC. Its new Superlog language, while not yet an open standard, has nonetheless been receiving high marks for significantly raising the level of hardware abstraction above Verilog while blending in some of the best features of VHDL. Utilizing a technology called Cblend, the Systemsim logic simulator parses Superlog, SystemC, Verilog, C, and C++ for execution on an interpreted simulation kernel, with focused compilation of parallel instruction streams. Systemex provides a path to implementation by extracting HDL code design constraints from Superlog models. Co-Design Automation has also announced planned support for VHDL and SpecC.

While Synopsys has long had a solid dataflow-centric DSP co-design methodology in the form of Cossap, its recent partnership with Coware on the SystemC language has been highly visible and well marketed as an important new direction for the company. In June, Synopsys formally announced a new product suite supporting SystemC: Cocentric System Studio and Cocentric SystemC Compiler. Cocentric System Studio is a design entry and modeling environment that utilizes block and state diagramming techniques, using graphical abstractions to represent concurrency, reactive control, and dynamic switching concepts. The tool also supports polymorphic type parameters so that a model can be more easily re-instantiated in a different context. System Studio uses a hybrid simulator, combining static-scheduled and data-driven computational semantics, and the user may select which models will use each engine within the simulator. The Cocentric SystemC Compiler integrates behavioral synthesis and RTL synthesis engines to transform SystemC descriptions into gate-level netlists, including support for resource sharing, multicycle/pipelined operations, FSM generation, and a new "MemWrap" graphical interface for mapping to specific types of memories. In addition, "Bcview" provides a graphical cross-linking capability back to source code.

C-Level Design focuses on C-based hardware generation with its System Compiler product, which outputs both VHDL and Verilog. Since the C-Level methodology uses only a "CycleC" coding-style subset of ANSI C/C++, the model can be compiled with any popular compiler and requires no license to execute. Csim includes a discrete-event simulation engine and provides the modeling and debugging environment, which helps manage complex projects that span multiple abstractions and clocking styles. Csim can run 250 to 1,500 times faster than equivalent HDL-based simulators, but it is also integrated with popular HDL simulators from Model Technology and Synopsys.

Y Explorations, Inc. (YXI) offers its own suite of C-based system-level tools, albeit with a twist. Sporting the SpecC language developed by its founder, Prof. Daniel Gajski, YXI emphasizes the distinction of using a C-based language that preserves hardware and abstraction semantics within the language itself, making it possible to catch semantic errors at compile time rather than discover them by chance in simulation.

YXI's tool suite includes XC, the compiler; XD, a reuse repository database; XE, a graphical environment; and XELL, a Tcl/Tk shell for advanced users. XC utilizes internal estimation algorithms to explore a mixed variety of reuse components, interface protocols, and architectures. XC analyzes the cost of various I/O protocols (for example, sync/async, FIFO, PCI) and maps the original I/O such that all constraints are satisfied. Behavioral synthesis features, such as automatic cycle scheduling and resource sharing, are also integrated. The XD/XE methodology includes algorithms that aid the capture of reusable components and communication channels.
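Cycle scheduling under resource constraints, mentioned for several of these tools, can be illustrated with a minimal list-scheduling sketch. This is a generic textbook formulation, not the algorithm of any particular product; the single functional-unit type and the operation names are simplifications for illustration. Each operation is placed in the earliest cycle where its predecessors have finished and a unit is still free:

```cpp
#include <algorithm>
#include <map>
#include <string>
#include <vector>

// One operation in the behavioral description, with indices of the
// operations whose results it consumes.
struct Op {
    std::string name;
    std::vector<int> deps;    // indices of predecessor ops
};

// Assign a cycle to each op, given `units` identical functional units
// available per cycle. Ops are assumed listed in dependency order.
std::vector<int> schedule(const std::vector<Op>& ops, int units) {
    std::vector<int> cycle(ops.size(), 0);
    std::map<int, int> used;                   // cycle -> units consumed
    for (std::size_t i = 0; i < ops.size(); ++i) {
        int earliest = 0;
        for (int d : ops[i].deps)              // wait for all predecessors
            earliest = std::max(earliest, cycle[d] + 1);
        while (used[earliest] >= units)        // resolve resource conflicts
            ++earliest;
        cycle[i] = earliest;
        ++used[earliest];
    }
    return cycle;
}
```

For the expression (a*b) * (c*d) with a single multiplier, the two independent multiplies must serialize into cycles 0 and 1 and the final multiply lands in cycle 2; with two multipliers, the schedule compresses to two cycles. Exploring exactly this latency-versus-area trade-off is what automated resource sharing buys the designer.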

Bright future ahead

Given the rapid explosion of consumer electronics, telecommunications, multimedia, and pervasive computing trends, the future could not be better for system-level EDA tools. It's difficult to imagine a scenario in which aging methodologies are not creating unacceptable bottlenecks to design productivity. What remains to be accomplished, however, is widespread convergence on a generalized ESL methodology and alignment on input languages and semantics, which will in turn permit the requisite interoperability for mass-market appeal.

Click here for this month's Focus Tables.


Steven E. Schulz is a senior member of the technical staff at Texas Instruments Inc.'s Worldwide ASIC division in Dallas. He serves on the board of directors of VHDL International and is the executive sponsor of the System-Level Design Language (SLDL) project.

To voice an opinion on this or any other article in Integrated System Design, please e-mail your comments to sdean@cmp.com.


Copyright © 2000 Integrated System Design Magazine
